CDiNN – Convex difference neural networks

Authors

Abstract

Neural networks with the ReLU activation function have been shown to be universal function approximators and to learn mappings as non-smooth functions. Recently, there has been considerable interest in the use of neural networks in applications such as optimal control. It is well known that optimization involving non-convex, non-smooth functions is computationally intensive and comes with limited convergence guarantees. Moreover, the choice of hyper-parameters used in gradient descent/ascent significantly affects the quality of the obtained solutions. A new neural network architecture called Input Convex Neural Networks (ICNNs) outputs a convex function of the inputs, thereby allowing the use of efficient convex optimization methods. The use of ICNNs for determining the input that minimizes the output has two major problems: learning a non-convex function as a convex mapping could result in significant approximation error, and we also note that the existing representations cannot capture simple dynamic structures like linear time delay systems. We attempt to address the above problems by introducing a new architecture, which we call CDiNN, that learns a function as a difference of polyhedral convex functions from data. We also discuss that, in some cases, the optimal input can be obtained from CDiNN through difference of convex optimization with convergence guarantees, where at each iteration the problem is reduced to a linear programming problem.
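The abstract describes CDiNN as learning a function as the difference of two polyhedral (piecewise-linear) convex functions. The sketch below is a rough illustration only: it composes two ICNN-style ReLU networks whose hidden-to-hidden weights are kept non-negative, and subtracts them. The class names, layer sizes, and the weight-clipping trick are illustrative assumptions, not the paper's exact parameterization.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PolyhedralConvexNet(nn.Module):
    """Scalar-valued ReLU network that is convex (and piecewise-linear) in x.

    ICNN-style construction: ReLU is convex and non-decreasing, and the
    hidden-to-hidden weights are forced non-negative when applied, so each
    layer preserves convexity in the input; the input skip connections may
    take any sign.
    """
    def __init__(self, in_dim: int, hidden: int = 64, n_layers: int = 2):
        super().__init__()
        self.first = nn.Linear(in_dim, hidden)
        self.skips = nn.ModuleList(nn.Linear(in_dim, hidden) for _ in range(n_layers))
        self.hiddens = nn.ModuleList(nn.Linear(hidden, hidden, bias=False) for _ in range(n_layers))
        self.out_skip = nn.Linear(in_dim, 1)
        self.out_hidden = nn.Linear(hidden, 1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.relu(self.first(x))
        for skip, hid in zip(self.skips, self.hiddens):
            # F.relu on the weight matrix enforces non-negativity,
            # which keeps z convex as a function of x
            z = F.relu(F.linear(z, F.relu(hid.weight)) + skip(x))
        return F.linear(z, F.relu(self.out_hidden.weight)) + self.out_skip(x)


class CDiNN(nn.Module):
    """f(x) = g(x) - h(x): a learned difference of polyhedral convex functions."""
    def __init__(self, in_dim: int, hidden: int = 64, n_layers: int = 2):
        super().__init__()
        self.g = PolyhedralConvexNet(in_dim, hidden, n_layers)
        self.h = PolyhedralConvexNet(in_dim, hidden, n_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.g(x) - self.h(x)
```

With f written this way, the minimization over the input mentioned in the abstract can, in principle, proceed by a convex-concave (DCA-style) scheme: linearize h at the current iterate and minimize the remaining convex, piecewise-linear surrogate, a subproblem that can be posed as a linear program, which is presumably the reduction to linear programming the abstract refers to.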


Similar articles

Input Convex Neural Networks

This paper presents the input convex neural network architecture. These are scalar-valued (potentially deep) neural networks with constraints on the network parameters such that the output of the network is a convex function of (some of) the inputs. The networks allow for efficient inference via optimization over some inputs to the network given others, and can be applied to settings including ...
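Because the ICNN output is convex in (some of) its inputs, the "inference via optimization" step can be carried out with plain gradient descent on the input, which then reaches a global minimizer. Below is a minimal sketch of that idea, assuming a scalar-valued convex network such as the PolyhedralConvexNet sketched above; the function name, step count, and learning rate are illustrative assumptions.

```python
import torch

def argmin_over_input(convex_net, x_init, steps=200, lr=0.05):
    """Minimise convex_net(x) over x by (sub)gradient descent.

    Because the output is convex in x, there are no spurious local
    minima, so a single descent run from any starting point suffices.
    """
    x = x_init.clone().requires_grad_(True)
    opt = torch.optim.SGD([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        convex_net(x).sum().backward()
        opt.step()
    return x.detach()

# example usage against the sketch above:
# x_star = argmin_over_input(PolyhedralConvexNet(in_dim=3), torch.zeros(1, 3))
```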


Convex Neural Networks

Convexity has recently received a lot of attention in the machine learning community, and the lack of convexity has been seen as a major disadvantage of many learning algorithms, such as multi-layer artificial neural networks. We show that training multi-layer neural networks in which the number of hidden units is learned can be viewed as a convex optimization problem. This problem involves an ...


Rodbar dam slope stability analysis using neural networks

In this study, an artificial neural network is presented for predicting the safety factor values and the critical factor of safety of non-homogeneous earth dams, while accounting for the effect of earthquake inertia forces. The model inputs include the dam height, upstream slope angle, earthquake coefficient, water height, and the strength parameters of the core and shell, and its outputs include the factor of safety. The most important quantity sought in slope stability analysis is the factor of safety. In this study ...
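The blurb describes a standard feed-forward regression setup (dam geometry, seismic coefficient, water level, and strength parameters in; factor of safety out). A generic sketch of such a regressor is shown below; the six-feature input layout and the layer sizes are assumptions for illustration, not details taken from that study.

```python
import torch
import torch.nn as nn

# hypothetical feature order: [dam_height, upstream_slope_angle,
# earthquake_coefficient, water_height, core_strength, shell_strength]
safety_factor_model = nn.Sequential(
    nn.Linear(6, 32),
    nn.ReLU(),
    nn.Linear(32, 32),
    nn.ReLU(),
    nn.Linear(32, 1),   # predicted factor of safety
)

features = torch.rand(1, 6)          # placeholder input, not real dam data
predicted_fs = safety_factor_model(features)
```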

Automatic Crater Detection Using Convex Grouping and Convolutional Neural Networks

Craters are some of the most important landmarks on the surface of many planets and can be used for autonomous safe landing and for spacecraft and rover navigation. Manual detection of craters is laborious and impractical, and many approaches have been proposed in the field to automate this task. However, none of these methods have yet become a standard tool for crater detection due to the challengi...


Breaking the Curse of Dimensionality with Convex Neural Networks

We consider neural networks with a single hidden layer and non-decreasing positively homogeneous activation functions like the rectified linear units. By letting the number of hidden units grow unbounded and using classical non-Euclidean regularization tools on the output weights, they lead to a convex optimization problem and we provide a detailed theoretical analysis of their generalization p...



Journal

Journal title: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2022.01.024